

Enhancing Machine Learning Model Performance with Hyper Parameter Optimization: A Comparative Study

Erden, Caner, Demir, Halil Ibrahim, Kökçam, Abdullah Hulusi

arXiv.org Artificial Intelligence

One of the most critical issues in machine learning is the selection of appropriate hyperparameters for training models. With hyperparameter optimization (HPO) techniques, machine learning models can reach their best training performance and improve their ability to generalize. HPO is a popular topic that artificial intelligence studies have focused on recently, and it has attracted increasing interest. The traditional methods developed for HPO include exhaustive search, grid search, random search, and Bayesian optimization; meta-heuristic algorithms are employed as more advanced methods. Meta-heuristic algorithms search the solution space, converging toward the best combination for a specific problem. These algorithms test various scenarios and evaluate the results to select the best-performing combinations. In this study, classical methods, such as grid search, random search, and Bayesian optimization, and population-based algorithms, such as genetic algorithms and particle swarm optimization, are discussed in terms of HPO. The use of the related search algorithms is explained together with Python code developed on packages such as Scikit-learn, Sklearn Genetic, and Optuna. The performance of the search algorithms is compared on a sample data set, and according to the results, the particle swarm optimization algorithm outperformed the other algorithms.
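To make the contrast between the classical methods concrete, here is a minimal, dependency-free sketch of grid search versus random search. The objective function, the parameter names (`learning_rate`, `n_estimators`), and the search ranges are hypothetical stand-ins for a real model's validation error; in practice these would come from cross-validation with a library such as Scikit-learn.

```python
import itertools
import random

# Hypothetical stand-in for a model's validation error, with its
# (assumed) optimum at learning_rate=0.1, n_estimators=200.
def validation_error(learning_rate, n_estimators):
    return (learning_rate - 0.1) ** 2 + ((n_estimators - 200) / 100) ** 2

# Grid search: exhaustively evaluate every combination on a fixed grid.
grid = {
    "learning_rate": [0.01, 0.05, 0.1, 0.5],
    "n_estimators": [50, 100, 200, 400],
}
grid_best = min(
    (dict(zip(grid, values)) for values in itertools.product(*grid.values())),
    key=lambda params: validation_error(**params),
)

# Random search: sample the same space continuously under a fixed
# budget of trials instead of enumerating a grid.
random.seed(0)
random_best = min(
    (
        {"learning_rate": random.uniform(0.01, 0.5),
         "n_estimators": random.randint(50, 400)}
        for _ in range(16)
    ),
    key=lambda params: validation_error(**params),
)

print("grid search best:", grid_best)
print("random search best:", random_best)
```

Grid search guarantees coverage of the chosen grid points but scales exponentially with the number of hyperparameters, while random search spends the same budget on distinct values of every parameter; Bayesian and population-based methods (such as the particle swarm optimizer favored in the study) go further by using earlier evaluations to guide where to sample next.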


How to Win a Kaggle Competition with Hyper Parameter Optimization

#artificialintelligence

In this blog post we highlight some of the key takeaways from David Austin's presentation on how to supercharge a 1st-place Kaggle solution to even higher performance. David Austin is a Senior Principal Artificial Intelligence Engineer at Intel working on industrial applications within the Internet of Things space. In his spare time he spends, in his own words, way too much time participating in Kaggle competitions, and he has held the title of Grandmaster since 2018. In the presentation, David Austin walks through the Iceberg Classifier Challenge, where participants were asked to classify radar images as either icebergs or ships to improve safety at sea. At the time, it was the computer vision challenge with the most participants ever on Kaggle.


Top 5 AI Articles of February 2022 Every Data Scientist Should Read

#artificialintelligence

I explain artificial intelligence terms and news to non-experts. Here are the five best HackerNoon articles related to artificial intelligence in February. I hope they will help you learn more about machine learning this year. Note that I curated these five articles from among hundreds of other super interesting ones that you might enjoy even more. So please feel free to browse the AI tag on HackerNoon and keep learning!